Deriving a New Divergence Measure from Extended Cross-Entropy Error Function
Authors
Abstract
Similar Resources
Cross Entropy Error Function in Neural Networks: Forecasting Gasoline Demand
This paper applies artificial neural networks to forecast gasoline consumption. The ANN is implemented using the cross entropy error function in the training stage. The cross entropy function is proven to accelerate the backpropagation algorithm and to provide good overall network performance with relatively short stagnation periods. To forecast gasoline consumption (GC), the ANN uses previous ...
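As a minimal sketch of the loss this abstract refers to (not the paper's forecasting model), the snippet below computes the binary cross-entropy error and notes why it speeds up backpropagation with sigmoid outputs: the sigmoid-derivative factor cancels in the gradient.

```python
import numpy as np

def cross_entropy_loss(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy error, averaged over samples."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# With a sigmoid output unit, the gradient of this loss with respect to the
# pre-activation reduces to (y_pred - y_true), removing the sigmoid-derivative
# factor that causes stagnation in squared-error backpropagation.
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.8])
print(cross_entropy_loss(y_true, y_pred))  # ≈ 0.1839
```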
A New Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
The existing upper and lower bounds between entropy and error are mostly derived through inequality techniques without reference to joint distributions. In fact, from both theoretical and application viewpoints, there is a need for a complete set of interpretations of these bounds in relation to joint distributions. For this reason, in this work we propose a new approach of deriving the bo...
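The paper's own bounds are not reproduced here; as a small numerical illustration of the entropy–error relationship it studies, the sketch below starts from a toy binary joint distribution, computes the Bayes error and the conditional entropy H(Y|X), and checks the classical binary form of Fano's inequality, H(Y|X) ≤ h(Pe).

```python
import numpy as np

def binary_entropy(p):
    """h(p) in bits, with h(0) = h(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Toy joint distribution p(x, y): rows index the feature x, columns the label y.
joint = np.array([[0.35, 0.05],
                  [0.10, 0.50]])

# Bayes error Pe: for each x, the probability mass of the less likely label.
bayes_error = joint.min(axis=1).sum()

# Conditional entropy H(Y|X) = sum_x p(x) * h(p(y=1 | x)).
cond_entropy = sum(row.sum() * binary_entropy(row[1] / row.sum()) for row in joint)

# Fano's inequality in the binary case: H(Y|X) <= h(Pe).
print(bayes_error, cond_entropy, binary_entropy(bayes_error))
```

Here Pe = 0.15 and H(Y|X) ≈ 0.607 bits, just below h(0.15) ≈ 0.610, so the bound holds and is nearly tight for this joint distribution.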
Improving Error Back Propagation Algorithm by using Cross Entropy Error Function and Adaptive Learning Rate
Improving the efficiency and convergence rate of multilayer backpropagation neural network algorithms is an important area of research. Recent research has paid increasing attention to entropy-based criteria in adaptive systems. Several principles have been proposed based on the maximization or minimization of a cross-entropy function. One way of entropy criteria in learning systems i...
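The abstract's combination of a cross-entropy error function with an adaptive learning rate can be sketched on a single logistic unit. This is an illustration, not the paper's algorithm: the adaptation rule used below is the common "bold driver" heuristic (grow the rate while the error falls, halve it when the error rises), and the data set is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data for a single logistic (sigmoid) unit.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w, b, lr = np.zeros(2), 0.0, 0.1

def forward(w, b):
    """Return predictions and the cross-entropy error."""
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    eps = 1e-12
    loss = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return p, loss

_, prev_loss = forward(w, b)
for _ in range(200):
    p, _ = forward(w, b)
    grad_w = X.T @ (p - y) / len(y)   # cross-entropy + sigmoid gradient
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b
    _, loss = forward(w, b)
    # "Bold driver" adaptive learning rate: accelerate on success, back off on failure.
    lr = lr * 1.05 if loss < prev_loss else lr * 0.5
    prev_loss = loss

print(round(prev_loss, 4))  # cross-entropy error after training
```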
Deriving the Qubit from Entropy Principles
We provide an axiomatization of the simplest quantum system, namely the qubit, based on entropic principles. Specifically, we show that the qubit can be derived from the set of maximum-entropy probabilities that satisfy an entropic version of the Heisenberg uncertainty principle. Our formulation is in phase space (following Wigner [41]) and makes use of Rényi [32] entropy (which includes Shannon [33...
Cross-entropy measure of uncertain variables
Cross-entropy is a measure of the difference between two distribution functions. In order to deal with the divergence of uncertain variables via uncertainty distributions, this paper aims at introducing the concept of cross-entropy for uncertain variables based on uncertainty theory, as well as investigating some mathematical properties of this concept. Several practical examples are also provided...
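The abstract above concerns uncertain variables in the sense of uncertainty theory; as a classical analogue only, the sketch below computes cross-entropy between two ordinary discrete distributions and shows the divergence-like gap it measures: H(p, q) − H(p, p) is exactly the KL divergence D(p ‖ q), which is zero when the distributions coincide.

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return -np.sum(p * np.log(q + eps))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# H(p, q) >= H(p, p) = H(p); the gap is the KL divergence D(p || q),
# i.e. the difference between the two distribution functions.
print(cross_entropy(p, q), cross_entropy(p, p))
```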
Journal
Journal title: International Journal of Contents
Year: 2015
ISSN: 1738-6764
DOI: 10.5392/ijoc.2015.11.2.057